First Notes on Maximum Entropy Entailment for Quantified Implications

Author

  • Francesco Kriegel
Abstract

Entropy is a measure of the uninformativeness or randomness of a data set: the higher the entropy, the lower the amount of information. In the field of propositional logic it has proven to be a suitable measure to maximize when dealing with models of probabilistic propositional theories. More specifically, it was shown that the maximum-entropy model of a probabilistic propositional theory allows for the deduction of further formulae that are somehow expected by humans, i.e., it supports a kind of common-sense reasoning. In order to carry the technique of maximum entropy entailment over to the field of Formal Concept Analysis, we define the notion of the entropy of a formal context with respect to the frequencies of its object intents, and then define maximum entropy entailment for quantified implication sets, i.e., for sets of partial implications where each implication has an assigned degree of confidence. This entailment technique is then utilized to define so-called maximum entropy implicational bases (ME-bases), and a first general example of such an ME-base is provided.
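
Since the full text is not reproduced here, the following Python sketch is only an illustration of the ingredients named in the abstract, not the paper's actual definitions: it computes a Shannon entropy from the relative frequencies of the object intents of a toy formal context, and the confidence of a partial implication. All names and the toy data (`entropy_of_context`, `confidence`, `context`) are hypothetical.

```python
from collections import Counter
from math import log2

# Toy formal context: each object is mapped to its intent (set of attributes).
# This context and the function names below are illustrative only.
context = {
    "o1": frozenset({"a", "b"}),
    "o2": frozenset({"a", "b"}),
    "o3": frozenset({"a", "c"}),
    "o4": frozenset({"b", "c"}),
}

def entropy_of_context(ctx):
    """Shannon entropy of the distribution of object intents,
    each intent weighted by its relative frequency."""
    counts = Counter(ctx.values())
    total = sum(counts.values())
    return -sum((n / total) * log2(n / total) for n in counts.values())

def confidence(ctx, premise, conclusion):
    """Confidence of the partial implication premise -> conclusion:
    the fraction of objects containing the premise that also contain the conclusion."""
    premise, conclusion = frozenset(premise), frozenset(conclusion)
    supporting = [intent for intent in ctx.values() if premise <= intent]
    if not supporting:
        return 1.0  # convention: an implication with empty support holds vacuously
    return sum(1 for intent in supporting if conclusion <= intent) / len(supporting)

print(entropy_of_context(context))        # 1.5 bits for this toy context
print(confidence(context, {"a"}, {"b"}))  # 2 of the 3 objects with 'a' also have 'b'
```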


Similar resources

Connecting Lexicographic with Maximum Entropy Entailment

This paper reviews and relates two default reasoning mechanisms, lexicographic (lex) and maximum entropy (me) entailment. Me-entailment requires that defaults be assigned specific strengths, and it is shown that lex-entailment can be equated to me-entailment for a class of specific strength assignments. By clarifying the assumptions which underlie lex-entailment, it is argued that me-entailment is ...


Combining probabilistic logic programming with the power of maximum entropy

This paper is on the combination of two powerful approaches to uncertain reasoning: logic programming in a probabilistic setting, on the one hand, and the information-theoretical principle of maximum entropy, on the other hand. More precisely, we present two approaches to probabilistic logic programming under maximum entropy. The first one is based on the usual notion of entailment under maximu...


Nilsson's Probabilistic Entailment Extended to Dempster-Shafer Theory

Probabilistic logic has been discussed in a recent paper by N. Nilsson [12]. An entailment scheme is proposed which can predict the probability of an event when the probabilities of certain other connected events are known. This scheme involves the use of a maximum entropy method proposed by P. Cheeseman in [3]. The model uses vectors which represent certain possible states of the world. Only ...
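
As a rough illustration of the possible-worlds idea sketched above (and not the scheme of the cited paper), the snippet below picks the maximum-entropy distribution over the four truth assignments of two propositions A and B, subject to hypothetical constraints P(A) = 0.7 and P(A → B) = 0.9, and reads off the entailed probability of B; it relies on scipy.optimize.minimize with SLSQP.

```python
import numpy as np
from scipy.optimize import minimize

# Possible worlds over propositions A and B: (A, B) in {(0,0), (0,1), (1,0), (1,1)}.
worlds = [(0, 0), (0, 1), (1, 0), (1, 1)]

# Hypothetical constraints: P(A) = 0.7 and P(A -> B) = 0.9.
constraints = [
    {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},
    {"type": "eq", "fun": lambda p: sum(p[i] for i, (a, _) in enumerate(worlds) if a) - 0.7},
    {"type": "eq", "fun": lambda p: sum(p[i] for i, (a, b) in enumerate(worlds) if (not a) or b) - 0.9},
]

def neg_entropy(p):
    # Negative Shannon entropy; the small epsilon avoids log(0).
    return np.sum(p * np.log(p + 1e-12))

result = minimize(neg_entropy, x0=np.full(4, 0.25), bounds=[(0, 1)] * 4,
                  constraints=constraints, method="SLSQP")
p = result.x
prob_B = sum(p[i] for i, (_, b) in enumerate(worlds) if b)
print(f"maximum-entropy estimate of P(B): {prob_B:.3f}")
```

For these particular constraints the maximum-entropy distribution spreads the remaining mass evenly over the worlds where A is false, so the printed value should be close to 0.75.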


Taylor Expansion for the Entropy Rate of Hidden Markov Chains

We study the entropy rate of a hidden Markov process, defined by observing the output of a symmetric channel whose input is a first-order Markov process. Although this definition is very simple, obtaining the exact value of the entropy rate is an open problem. We introduce some probability matrices based on the Markov chain's and the channel's parameters. Then, we try to obtain an estimate ...
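
Purely as context for the setup described above (and not the Taylor-expansion technique of that paper), here is a small sketch under assumed parameters: a binary first-order Markov chain with flip probability p is observed through a binary symmetric channel with crossover probability eps, and the conditional entropy H(Y_n | Y_1, ..., Y_{n-1}) is computed exactly by enumerating output blocks with the forward algorithm, which approximates the entropy rate from above.

```python
from itertools import product
from math import log2

p, eps = 0.3, 0.1                     # hypothetical Markov flip and channel crossover probabilities
P = [[1 - p, p], [p, 1 - p]]          # symmetric first-order Markov transition matrix
E = [[1 - eps, eps], [eps, 1 - eps]]  # binary symmetric channel emission matrix

def seq_prob(y):
    """P(Y_1..Y_n = y) via the forward algorithm over the hidden Markov state."""
    alpha = [0.5 * E[x][y[0]] for x in (0, 1)]  # stationary start distribution (0.5, 0.5)
    for obs in y[1:]:
        alpha = [sum(alpha[x] * P[x][x2] for x in (0, 1)) * E[x2][obs] for x2 in (0, 1)]
    return sum(alpha)

def block_entropy(n):
    """Shannon entropy H(Y_1..Y_n), summing over all 2^n output sequences."""
    return -sum(q * log2(q) for y in product((0, 1), repeat=n)
                for q in [seq_prob(y)] if q > 0)

n = 12
rate = block_entropy(n) - block_entropy(n - 1)  # H(Y_n | Y_1..Y_{n-1})
print(f"entropy-rate estimate: {rate:.4f} bits/symbol")
```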


Uncertain Satisfiability and Uncertain Entailment

Uncertain logic is a branch of multi-valued logic for dealing with subjective information, which explains formulas as uncertain variables and defines their truth values as uncertain measures. In this paper, uncertain satisfiability is proposed to verify the consistency of a set of truth values for the formulas. Furthermore, for a set of satisfiable formulas with given truth values, the problem ...



Journal:

Volume   Issue

Pages  -

Year of publication: 2017